These three articles address the actual importance of a college degree and whether it is still worth pursuing. I agree with them to a certain extent. On the question of whether a college degree is as important as it used to be, I agree that it is not. Going to college has become too natural for many people; it is treated as a given rather than a decision to weigh. Because of this, and because of the rising number of college graduates, having a degree no longer does as much to open up job opportunities. I now feel that going to college is more about the experience. One of the articles claimed that for most jobs, college teaches little that actually makes the work easier or that could not be learned elsewhere. This is where I disagree. I think most successful careers do require certain skills and knowledge that are taught in college. My only reservation is that colleges sometimes teach more than students actually need.
